💠 Compositional Learning Journal Club

Join us this week for an in-depth discussion on Compositional Learning in the context of cutting-edge text-to-image generative models. We will explore recent breakthroughs and challenges, focusing on how these models handle compositional tasks and where improvements can be made.

✅ This Week's Presentation:

🔹 Title: Counting Understanding in Vision-Language Models

🔸 Presenter: Arash Marioriyad

🌀 Abstract:
Counting-related challenges represent some of the most significant compositional understanding failure modes in vision-language models (VLMs) such as CLIP. While humans, even in early stages of development, readily generalize over numerical concepts, these models often struggle to accurately interpret numbers beyond three, with the difficulty intensifying as the numerical value increases. In this presentation, we explore the counting-related limitations of VLMs and examine solutions the field has proposed to address them.
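As a rough illustration of the failure mode above (not taken from this week's papers), counting ability in a CLIP-style model can be probed by framing it as zero-shot classification over count prompts such as "a photo of three cats" and checking which count the image–text similarity prefers. The sketch below assumes the Hugging Face `transformers` library and the `openai/clip-vit-base-patch32` checkpoint; the prompt template and helper names are our own assumptions, not a method from the papers.

```python
# Hypothetical counting probe for a CLIP-style VLM: score an image against
# prompts "a photo of one cat", ..., "a photo of ten cats" and report the
# count whose prompt matches best. Prompt template is an assumption.

def count_prompts(noun: str, max_count: int = 10) -> list[str]:
    """Build count prompts for counts 1..max_count (max_count <= 10 here)."""
    words = ["one", "two", "three", "four", "five",
             "six", "seven", "eight", "nine", "ten"]
    plural = noun if noun.endswith("s") else noun + "s"
    return [f"a photo of {words[n - 1]} {noun if n == 1 else plural}"
            for n in range(1, max_count + 1)]

def predicted_count(image, noun: str = "cat", max_count: int = 10) -> int:
    # Heavy imports are kept local so count_prompts stays dependency-free.
    import torch
    from transformers import CLIPModel, CLIPProcessor

    model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
    processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

    prompts = count_prompts(noun, max_count)
    inputs = processor(text=prompts, images=image,
                       return_tensors="pt", padding=True)
    with torch.no_grad():
        # logits_per_image has shape (1, max_count): one score per prompt.
        logits = model(**inputs).logits_per_image
    return int(logits.argmax(dim=-1).item()) + 1  # back to a 1-based count
```

Running `predicted_count` over images with a known number of objects typically shows the pattern the abstract describes: accuracy is reasonable for one to three objects and degrades as the true count grows.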

📄 Papers:
- Teaching CLIP to Count to Ten (ICCV, 2023)
- CLIP-Count: Towards Text-Guided Zero-Shot Object Counting (ACM-MM, 2023)


Session Details:
- 📅 Date: Sunday
- 🕒 Time: 5:00 – 6:00 PM
- 🌐 Location: Online at vc.sharif.edu/ch/rohban


We look forward to your participation! ✌️



tg-me.com/RIMLLab/146
BY RIML Lab